Mistral 7B V0.1 Flashback V2
License: MIT
A pretrained continuation model based on Mistral-7B-v0.1, fine-tuned on 40 GB of text from the Swedish forum Flashback; it supports multilingual generation.
Category: Large Language Model
Tags: Transformers, Supports Multiple Languages
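
The card names the Transformers library but includes no usage snippet, so here is a minimal continuation sketch. The repository ID `timpal0l/Mistral-7B-v0.1-flashback-v2` and the Swedish prompt are assumptions for illustration; substitute the model's actual Hub path before running.

```python
# Minimal text-continuation sketch using Hugging Face Transformers.
# The repository ID below is an assumption -- replace it with the
# model's actual Hub path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "timpal0l/Mistral-7B-v0.1-flashback-v2"  # assumed Hub path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # halve memory for a 7B model on GPU
    device_map="auto",          # automatic placement (requires accelerate)
)

# This is a base continuation model, not an instruction-tuned chat model:
# it extends the prompt rather than answering it.
prompt = "Jag har funderat på att köpa en elbil, och"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Sampling parameters (`temperature`, `top_p`) are illustrative defaults, not values recommended by the model authors; adjust them to taste for more or less varied continuations.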